A Bayes Risk Minimization Machine for Example-Dependent Cost Classification
Authors
Abstract
A new method for example-dependent cost (EDC) classification is proposed. The method constitutes an extension of a recently introduced training algorithm for neural networks. A surrogate function is used to estimate the Bayesian risk, where the estimates of the conditional probabilities of each class are defined in terms of a 1-D Parzen window estimator of the output (discriminative) of the network. This probability density is modeled with the objective of allowing an easy minimization of a sampled version of the Bayes risk. The probability densities included in the definition of the risk are not explicitly estimated, but the risk is directly minimized by a gradient-descent algorithm. The proposed method has been evaluated using linear classifiers and neural networks, with both shallow (a single hidden layer) and deep (multiple hidden layers) architectures. The experimental results show the potential and flexibility of the method, which can handle EDC classification under the imbalanced data situations that commonly appear in this kind of problem.
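The core idea of the abstract can be sketched in simplified form. The sketch below is not the authors' algorithm: it omits the 1-D Parzen window calibration and instead treats a plain logistic output as the class-posterior estimate, then minimizes a differentiable surrogate of the sampled example-dependent-cost risk by gradient descent. All data, cost assignments, and hyperparameters are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D data: two Gaussian classes (illustration only).
N = 400
X = np.vstack([rng.normal([-1.0, 0.0], 1.0, size=(N // 2, 2)),
               rng.normal([+1.0, 0.0], 1.0, size=(N // 2, 2))])
y = np.concatenate([np.zeros(N // 2), np.ones(N // 2)])

# Example-dependent costs: misclassifying example i incurs its own cost c_i.
# Growing with |x_2| is an arbitrary illustrative choice.
c = 1.0 + np.abs(X[:, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def risk(w, b):
    """Sampled surrogate Bayes risk:
    R(w, b) = (1/N) * sum_i c_i * [ y_i*(1 - p_i) + (1 - y_i)*p_i ],
    where p_i = sigmoid(w.x_i + b) plays the role of P(class 1 | x_i)."""
    p = sigmoid(X @ w + b)
    return np.mean(c * (y * (1 - p) + (1 - y) * p))

# Gradient descent on the sampled risk (analytic gradient of R above:
# dR/dz_i = c_i * p_i * (1 - p_i) * (1 - 2*y_i), with z_i = w.x_i + b).
w, b = np.zeros(2), 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w + b)
    g = c * p * (1 - p) * (1 - 2 * y) / len(y)
    w -= lr * (X.T @ g)
    b -= lr * g.sum()

print(risk(np.zeros(2), 0.0))  # risk of the trivial (p = 0.5) classifier
print(risk(w, b))              # risk after gradient-descent minimization
```

Because each error term is weighted by its own cost c_i, the minimizer shifts the decision boundary to protect the expensive examples, which is the behavior EDC classification targets; the authors' method additionally models the output density with a Parzen window so the risk estimate is smooth in the network parameters.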
Similar resources
Surrogate losses for cost-sensitive classification with example-dependent costs
We study surrogate losses in the context of cost-sensitive classification with example-dependent costs, a problem also known as regression level set estimation. We give sufficient conditions on the surrogate loss for the existence of a surrogate regret bound. Such bounds imply that as the surrogate risk tends to its optimal value, so too does the expected misclassification cost. These kinds of ...
Risk Classification with an Adaptive Naive Bayes Kernel Machine Model
Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals...
Bayes Risk Minimization in Natural Language Parsing
Candidate selection from n-best lists is a widely used approach in natural language parsing. Instead of attempting to select the most probable candidate, we focus on prediction of a new structure which minimizes an approximation to Bayes risk. Our approach does not place any restrictions on the probabilistic model used. We show how this approach can be applied in both dependency and constituent...
Bayes risk minimization using metric loss functions
In this work, fundamental properties of Bayes decision rule using general loss functions are derived analytically and are verified experimentally for automatic speech recognition. It is shown that, for maximum posterior probabilities larger than 1/2, Bayes decision rule with a metric loss function always decides on the posterior maximizing class independent of the specific choice of (metric) lo...
Adversarial Multiclass Classification: A Risk Minimization Perspective
Recently proposed adversarial classification methods have shown promising results for cost sensitive and multivariate losses. In contrast with empirical risk minimization (ERM) methods, which use convex surrogate losses to approximate the desired non-convex target loss function, adversarial methods minimize non-convex losses by treating the properties of the training data as being uncertain and...
Journal
Journal title: IEEE Transactions on Cybernetics
Year: 2021
ISSN: 2168-2267, 2168-2275
DOI: https://doi.org/10.1109/tcyb.2019.2913572